132 research outputs found

    A mathematical resurgence of risk management: an extreme modeling of expert opinions

    The Operational Risk Advanced Measurement Approach requires financial institutions to use scenarios to model operational risks and to evaluate the pertaining capital charges. Considering that a banking group is composed of numerous entities (branches and subsidiaries), and that each one of them is represented by an Operational Risk Manager (ORM), we propose a novel scenario approach based on ORM expertise to collect information and create new data sets focusing on large losses, and the use of Extreme Value Theory (EVT) to evaluate the corresponding capital allocation. In this paper, we highlight the importance of combining the experts' a priori knowledge with an a posteriori backtesting based on collected incidents.
    Keywords: Basel II; operational risks; EVT; AMA; expert; Value-at-Risk; expected shortfall
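The EVT step the abstract describes can be sketched with a peaks-over-threshold (POT) quantile estimator. Every number below (threshold, GPD shape and scale, exceedance counts) is a hypothetical stand-in for parameters an institution would fit to its expert-generated large-loss data, not a figure from the paper.

```python
# Hypothetical fitted quantities (none of these come from the paper):
u = 1_000_000.0              # threshold separating "large" scenario losses
xi, sigma = 0.4, 500_000.0   # GPD shape and scale fitted above the threshold
n, n_u = 5_000, 150          # sample size and number of exceedances over u

def gpd_var(p):
    """POT quantile estimator: VaR_p = u + (sigma/xi) * (((n/n_u)*(1-p))**(-xi) - 1)."""
    return u + (sigma / xi) * (((n / n_u) * (1.0 - p)) ** (-xi) - 1.0)

def gpd_es(p):
    """Expected shortfall implied by the GPD tail (finite only for xi < 1)."""
    return (gpd_var(p) + sigma - xi * u) / (1.0 - xi)
```

With a heavy tail (xi well above 0), the 99.9% figures sit far above the threshold, which is why backtesting the experts' a priori view against collected incidents matters.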

    A new algorithm for the loss distribution function with applications to Operational Risk Management

    Managing operational risks inside banks and insurance companies is currently an important task. The computation of a risk measure associated with these risks relies on knowledge of the so-called Loss Distribution Function. Traditionally, this distribution function is computed via the Panjer algorithm, which is iterative. In this paper, we propose an adaptation of this algorithm in order to improve the computation of convolutions between Panjer-class distributions and continuous distributions. This new approach drastically reduces the variance of the estimated VaR associated with the operational risks.
    Keywords: Operational risk, Panjer algorithm, Kernel, numerical integration, convolution
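The classical Panjer recursion the abstract builds on can be sketched for a Poisson frequency and a discretised severity; the frequency parameter and severity pmf below are illustrative choices, not the paper's data.

```python
import math

def panjer_poisson(lam, f, m):
    """Compound-Poisson pmf g[0..m-1] via the Panjer recursion.
    lam: Poisson mean; f: discretised severity pmf on 0, 1, 2, ...; m: grid size."""
    g = [0.0] * m
    g[0] = math.exp(lam * (f[0] - 1.0))  # P(S = 0)
    for s in range(1, m):
        top = min(s, len(f) - 1)
        g[s] = (lam / s) * sum(j * f[j] * g[s - j] for j in range(1, top + 1))
    return g

def quantile(pmf, alpha):
    """Smallest s with P(S <= s) >= alpha (a VaR read off the compound pmf)."""
    acc = 0.0
    for s, p in enumerate(pmf):
        acc += p
        if acc >= alpha:
            return s

# Illustrative severity on {1, 2, 3}, with two claims expected on average
g = panjer_poisson(2.0, [0.0, 0.5, 0.3, 0.2], 60)
```

The paper's contribution concerns convolving such discrete Panjer-class distributions with continuous severities; the sketch only shows the baseline iterative recursion being adapted.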

    Bringing the Customer Back to the Foreground: The End of Conduct Risk?

    Working paper URL: http://ces.univ-paris1.fr/cesdp/cesdp2016.html
    Documents de travail du Centre d'Economie de la Sorbonne 2016.67 - ISSN: 1955-611X
    In this chapter, we argue that conduct risk arising from the way financial institutions conduct business with respect to their customers might be prevented, mitigated and potentially eliminated. Indeed, we believe that data science, proper segmentation, product design and control will lead to a tremendous reduction of conduct risk exposure, and as such these topics are addressed here.

    Risk or Regulatory Capital? Bringing distributions back in the foreground

    Working paper URL: http://centredeconomiesorbonne.univ-paris1.fr/documents-de-travail/
    Documents de travail du Centre d'Economie de la Sorbonne 2015.46 - ISSN: 1955-611X
    This paper discusses the regulatory requirement (Basel Committee, ECB-SSM and EBA) to measure financial institutions' major risks, for instance Market, Credit and Operational, regarding the choice of the risk measures, the choice of the distributions used to model them and the level of confidence. We highlight and illustrate the paradoxes and issues observed when implementing one approach over another, and the inconsistencies between the suggested methodologies and the goal to achieve. This paper makes some recommendations to the supervisor and proposes alternative procedures to measure the risks.
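How much the resulting capital figure depends on the chosen measure and confidence level can be illustrated on simulated heavy-tailed losses; the Pareto tail index, sample size and seed below are arbitrary choices for the sketch, not values from the paper.

```python
import random
import statistics

random.seed(42)  # deterministic illustration
# Simulated heavy-tailed losses: Pareto with tail index 1.5 (arbitrary choice)
losses = sorted(random.paretovariate(1.5) for _ in range(100_000))

def var(alpha):
    """Empirical Value-at-Risk: the alpha-quantile of the loss sample."""
    return losses[int(alpha * len(losses)) - 1]

def es(alpha):
    """Empirical expected shortfall: mean loss at or beyond the alpha-VaR."""
    return statistics.fmean(losses[int(alpha * len(losses)) - 1:])

# Same data, but the capital figure moves sharply with measure and confidence level
print(f"99%   VaR={var(0.99):10.1f}  ES={es(0.99):10.1f}")
print(f"99.9% VaR={var(0.999):10.1f}  ES={es(0.999):10.1f}")
```

For this tail index the theoretical ES is three times the VaR at any level, so switching measure or moving from 99% to 99.9% multiplies the requirement, which is the kind of paradox the paper highlights.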

    Distortion Risk Measures or the Transformation of Unimodal Distributions into Multimodal Functions

    Working paper URL: http://ces.univ-paris1.fr/cesdp/cesdp2014.html
    Chapter in "Future Perspectives in Risk Models and Finance", eds. A. Bensoussan, D. Guegan, C. Tapiero, Volume 211 of the International Series in Operations Research & Management Science, 89-124, 2015
    Documents de travail du Centre d'Economie de la Sorbonne 2014.08 - ISSN: 1955-611X
    The particular subject of this paper is to construct a general framework that can consider and analyse upside and downside risks at the same time. The paper offers a comparative analysis of risk measure concepts: we focus on quantile-based risk measures (ES and VaR), spectral risk measures and distortion risk measures. After introducing each measure, we investigate its interest and limits. Knowing that quantile-based risk measures cannot correctly capture the risk aversion of a risk manager, and that spectral risk measures can be inconsistent with risk aversion, we propose and develop a new distortion risk measure extending the work of Wang (2000) [38] and Sereda et al. (2010) [34] to bimodal and multimodal distributions. Finally, we provide a comprehensive analysis of the feasibility of this approach using the S&P500 data set from 01/01/1999 to 31/12/2011.
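A minimal sketch of the Wang (2000) distortion this paper extends, applied to a simple exponential loss; the distortion parameter, the loss model and the integration grid are illustrative assumptions, not the paper's multimodal construction.

```python
import math
from statistics import NormalDist

nd = NormalDist()

def wang(u, lam):
    """Wang distortion g(u) = Phi(Phi^{-1}(u) + lam), for 0 < u < 1."""
    return nd.cdf(nd.inv_cdf(u) + lam)

def distorted_mean(survival, lam, upper=30.0, steps=6_000):
    """rho(X) = integral over x of g(S(x)) dx for a nonnegative loss X,
    approximated by the midpoint rule on [0, upper]."""
    dx = upper / steps
    return sum(wang(survival((i + 0.5) * dx), lam) * dx for i in range(steps))

# Exponential(1) loss: the plain mean is 1; the distortion loads the tail
rho = distorted_mean(lambda x: math.exp(-x), lam=0.5)
```

Because g lifts the survival function when lam > 0, the distorted mean exceeds the expected loss, encoding risk aversion directly in the distortion rather than in a quantile level.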

    Combining risk measures to overcome their limitations - spectrum representation of the sub-additivity issue, distortion requirement and added-value of the Spatial VaR solution: An application to Regulatory Requirement for Financial Institutions

    Working paper URL: http://ces.univ-paris1.fr/cesdp/cesdp2016.html
    Documents de travail du Centre d'Economie de la Sorbonne 2016.66 - ISSN: 1955-611X
    To measure the major risks experienced by financial institutions, for instance Market, Credit and Operational, regarding the risk measures, the distributions used to model them and the level of confidence, the regulation either offers a limited choice or demands the implementation of a particular approach. In this paper, we highlight and illustrate the paradoxes and issues observed when implementing one approach over another, the inconsistencies between the suggested methodologies and the problems related to their interpretation. Starting from the notion of coherence, we discuss the properties of these measures, propose alternative solutions and new risk measures such as the spectrum and spatial approaches, and provide practitioners and supervisors with some recommendations to assess, manage and control risks in a financial institution.
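The sub-additivity issue named in the title can be shown with a textbook-style two-position example (the loss amounts and probabilities below are illustrative): each position alone has a 95% VaR of zero, yet the merged position does not, so VaR penalises diversification here.

```python
def var(pmf, alpha):
    """Lower quantile: the smallest loss x with P(X <= x) >= alpha."""
    acc = 0.0
    for x in sorted(pmf):
        acc += pmf[x]
        if acc >= alpha:
            return x

# Two independent, identically distributed positions: lose 100 with probability 4%
x = {0: 0.96, 100: 0.04}
# Distribution of the combined position X + Y
xy = {0: 0.96 ** 2, 100: 2 * 0.96 * 0.04, 200: 0.04 ** 2}

standalone = var(x, 0.95)  # VaR of each position alone
combined = var(xy, 0.95)   # VaR of the merged position
```

Since P(X = 0) = 0.96 covers the 95% level, each stand-alone VaR is 0, while P(X + Y = 0) = 0.9216 does not, pushing the combined VaR to 100 and violating VaR(X + Y) <= VaR(X) + VaR(Y).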

    The Cascade Bayesian Approach for a controlled integration of internal data, external data and scenarios

    Working paper URL: http://centredeconomiesorbonne.univ-paris1.fr/bandeau-haut/document-de-travail/
    Documents de travail du Centre d'Economie de la Sorbonne 2013.09 - ISSN: 1955-611X
    According to the latest proposals of the Basel Committee on Banking Supervision, banks under the Advanced Measurement Approach (AMA) must use four different sources of information to assess their Operational Risk capital requirement. The fourth being "business environment and internal control factors", i.e. qualitative criteria, the three main quantitative sources available to banks to build the Loss Distribution are Internal Loss Data, External Loss Data and Scenario Analysis. This paper proposes an innovative methodology to bring these three sources together in the Loss Distribution Approach (LDA) framework through a Bayesian strategy, ensuring that the resulting model is primarily driven by internal data, which can be crucial for practitioners. The integration of the different elements is performed in two steps. In the first step, scenarios inform the prior distributions and external data inform the likelihood component of the posterior function. In the second step, the initial posterior function is used as the prior distribution and the internal loss data inform the likelihood component of the second posterior. This latter posterior function enables the estimation of the parameters of the severity distribution selected to represent the Operational Risk event types.
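The two-step cascade can be sketched with a conjugate Normal-Normal model for the mean of log-losses; the prior, the observation variance and both data samples below are invented for illustration and stand in for whatever severity model and data a bank would actually use.

```python
def normal_update(prior_mean, prior_var, data, obs_var):
    """Conjugate Normal-Normal update for a mean with known observation variance."""
    n = len(data)
    xbar = sum(data) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * xbar / obs_var)
    return post_mean, post_var

# Invented log-loss samples standing in for external and internal collections
external = [9.2, 10.5, 9.8, 10.1, 9.6]
internal = [8.9, 9.1, 9.4, 8.7, 9.0, 9.2]

# Step 1: scenario-informed prior; external data drive the likelihood
m1, v1 = normal_update(prior_mean=10.0, prior_var=4.0, data=external, obs_var=2.0)
# Step 2: the first posterior becomes the prior; internal data drive the likelihood
m2, v2 = normal_update(m1, v1, data=internal, obs_var=2.0)
```

Because the internal sample enters last and informs the final likelihood, the second posterior is pulled toward the internal data, which is the cascade's way of keeping the model internal-data driven.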

    A mathematical resurgence of risk management: an extreme modeling of expert opinions

    Working paper URL: http://ces.univ-paris1.fr/cesdp/cesdp2011.html
    Published in "Frontiers in Finance and Economics", Volume 11, 1, 25-45, 2014
    Documents de travail du Centre d'Economie de la Sorbonne 2011.57 - ISSN: 1955-611X

    Shapley allocation, diversification and services in operational risk

    In this paper, a method of allocating operational risk regulatory capital using a closed-form Shapley method, applicable to a large number of business units (BUs), is proposed. It is assumed that if BUs form coalitions, the value added to a coalition by a new entrant is a simple function of the value of that entrant. This function represents the diversification that can be achieved by combining operational risk losses. Two such functions are considered. The calculations account for a service that further reduces the capital payable by BUs. The results derived are applied to recent loss data.
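The allocation idea can be sketched with a brute-force Shapley computation over a handful of units; the characteristic function below (the Euclidean norm of stand-alone capital, a sub-additive diversification rule) and the capital figures are hypothetical, not one of the two functions or the closed form studied in the paper.

```python
import math
from itertools import permutations

# Hypothetical stand-alone regulatory capital per business unit
capital = {"BU1": 30.0, "BU2": 40.0, "BU3": 50.0}

def v(coalition):
    """Assumed sub-additive characteristic function: pooling BUs diversifies, so a
    coalition's capital is the Euclidean norm of its members' stand-alone figures."""
    return math.sqrt(sum(capital[b] ** 2 for b in coalition))

def shapley(players):
    """Shapley value by averaging each player's marginal contribution over all orderings."""
    alloc = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        seen = []
        for p in order:
            alloc[p] += v(seen + [p]) - v(seen)
            seen.append(p)
    return {p: a / len(orders) for p, a in alloc.items()}

alloc = shapley(list(capital))
```

The allocations sum exactly to the grand coalition's capital (efficiency) and each BU pays less than its stand-alone figure, which is the diversification benefit the paper's closed form distributes; the brute-force average is exponential in the number of BUs, which is why a closed form matters for many units.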